The history of logic is the study of the development of the science of valid inference (logic). While many cultures have employed intricate systems of reasoning, and logical methods are evident in all human thought, an explicit analysis of the principles of reasoning was developed only in three traditions: those of China, India, and Greece. Of these, only the treatment of logic descending from the Greek tradition, particularly Aristotelian logic, found wide application and acceptance in science and mathematics. Logic was known as dialectic or analytic in Ancient Greece.
Aristotle's logic was further developed by Islamic and then medieval European logicians, reaching a high point in the mid-fourteenth century. The period between the fourteenth century and the beginning of the nineteenth century was largely one of decline and neglect, and is generally regarded as barren by historians of logic.[1]
Logic was revived in the mid-nineteenth century, at the beginning of a revolutionary period when the subject developed into a rigorous and formalistic discipline whose exemplar was the exact method of proof used in mathematics. The development of the modern so-called "symbolic" or "mathematical" logic during this period is the most significant in the two-thousand-year history of logic, and is arguably one of the most important and remarkable events in human intellectual history.[2]
Progress in mathematical logic in the first few decades of the twentieth century, particularly arising from the work of Gödel and Tarski, had a significant impact on analytic philosophy and philosophical logic, particularly from the 1950s onwards, in subjects such as modal logic, temporal logic, deontic logic, and relevance logic.
Valid reasoning has been employed in all periods of human history. However, logic studies the principles of valid reasoning, inference and demonstration. It is probable that the idea of demonstrating a conclusion first arose in connection with geometry, which originally meant the same as "land measurement".[3] In particular, the ancient Egyptians had empirically discovered some truths of geometry, such as the formula for the volume of a truncated pyramid.[4]
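The truncated-pyramid result referred to here is, in modern notation, the frustum formula, which the Moscow Mathematical Papyrus applies numerically rather than stating symbolically:

```latex
% Volume of a truncated pyramid (frustum) with square base of side a,
% square top of side b, and height h:
V = \frac{h}{3}\left(a^2 + ab + b^2\right)
```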
Another origin can be seen in Babylonia. Esagil-kin-apli's medical Diagnostic Handbook in the 11th century BC was based on a logical set of axioms and assumptions,[5] while Babylonian astronomers in the 8th and 7th centuries BC employed an internal logic within their predictive planetary systems, an important contribution to the philosophy of science.[6]
While the ancient Egyptians empirically discovered some truths of geometry, the great achievement of the ancient Greeks was to replace empirical methods by demonstrative science.[4] The systematic study of this seems to have begun with the school of Pythagoras in the late sixth century BC.[4] The three basic principles of geometry are that certain propositions must be accepted as true without demonstration, that all other propositions of the system are derived from these, and that the derivation must be formal, that is, independent of the particular subject matter in question.[4] Fragments of early proofs are preserved in the works of Plato and Aristotle,[7] and the idea of a deductive system was probably known in the Pythagorean school and the Platonic Academy.[4]
Separately from geometry, the idea of a standard argument pattern is found in the reductio ad absurdum used by Zeno of Elea, a pre-Socratic philosopher of the fifth century BC. This is the technique of drawing an obviously false, absurd or impossible conclusion from an assumption, thus demonstrating that the assumption is false.[8] Plato's Parmenides portrays Zeno as claiming to have written a book defending the monism of Parmenides by demonstrating the absurd consequence of assuming that there is plurality. Other philosophers who practised such dialectic reasoning were the so-called minor Socratics, including Euclid of Megara, who were probably followers of Parmenides and Zeno. The members of this school were called "dialecticians" (from a Greek word meaning "to discuss").
Further evidence that pre-Aristotelian thinkers were concerned with the principles of reasoning is found in the fragment called Dissoi Logoi, probably written at the beginning of the fourth century BC. This is part of a protracted debate about truth and falsity.[9]
None of the surviving works of the great fourth-century philosopher Plato (428–347 BC) include any formal logic,[10] but they include important contributions to the field of philosophical logic. Plato raises three questions: What is it that can properly be called true or false? What is the nature of the connection between the assumptions of a valid argument and its conclusion? What is the nature of definition?
The first question arises in the dialogue Theaetetus, where Plato identifies thought or opinion with talk or discourse (logos).[11] The second question is a result of Plato's theory of Forms. Forms are not things in the ordinary sense, nor strictly ideas in the mind, but they correspond to what philosophers later called universals, namely an abstract entity common to each set of things that have the same name. In both The Republic and The Sophist, Plato suggests that the necessary connection between the premisses and the conclusion of an argument corresponds to a necessary connection between "forms".[12] The third question is about definition. Many of Plato's dialogues concern the search for a definition of some important concept (justice, truth, the Good), and it is likely that Plato was impressed by the importance of definition in mathematics.[13] What underlies every definition is a Platonic Form, the common nature present in different particular things. Thus a definition reflects the ultimate object of our understanding, and is the foundation of all valid inference. This had a great influence on Aristotle, in particular Aristotle's notion of the essence of a thing, the "what it is to be" a particular thing of a certain kind.[14]
The logic of Aristotle, and particularly his theory of the syllogism, has had an enormous influence in Western thought.[15] His logical works, called the Organon, are the earliest formal study of logic that has come down to modern times. Though it is difficult to determine the dates, the probable order of writing of Aristotle's logical works is:
These works are of outstanding importance in the history of logic. Aristotle was the first logician to attempt a systematic analysis of logical syntax into noun (or term) and verb. In the Categories, he attempted to classify all the possible things that a term can refer to. This idea underpins his philosophical work, the Metaphysics, which also had a profound influence on Western thought. He was the first to deal with the principles of contradiction and excluded middle in a systematic way. He was the first formal logician (i.e. he gave the principles of reasoning using variables to show the underlying logical form of arguments). He was looking for relations of dependence which characterise necessary inference, and distinguished the validity of these relations from the truth of the premises (the soundness of the argument). The Prior Analytics contains his exposition of the "syllogistic", where three important principles are applied for the first time in history: the use of variables, a purely formal treatment, and the use of an axiomatic system. In the Topics and Sophistical Refutations he also developed a theory of non-formal logic (e.g. the theory of fallacies).[16]
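Aristotle's use of variables can be illustrated in modern, set-theoretic terms. The sketch below is an anachronistic reading, not Aristotle's own formulation: it treats terms as sets, checks the syllogism Barbara exhaustively over a small universe, and shows that an invalid form (the fallacy of the undistributed middle) fails.

```python
from itertools import combinations, product

def subsets(universe):
    """Yield all subsets of a finite universe."""
    us = list(universe)
    for r in range(len(us) + 1):
        for combo in combinations(us, r):
            yield frozenset(combo)

def holds_universally(inference, n=3):
    """Test a schematic inference over every assignment of subsets of an
    n-element universe to the term variables S, M, P -- a modern reading
    of Aristotle's use of variables for logical form."""
    return all(inference(s, m, p)
               for s, m, p in product(subsets(range(n)), repeat=3))

# Barbara: All M are P, All S are M, therefore All S are P.
barbara = lambda s, m, p: not (m <= p and s <= m) or s <= p
# Undistributed middle: All P are M, All S are M, therefore All S are P.
undistributed = lambda s, m, p: not (p <= m and s <= m) or s <= p

print(holds_universally(barbara))        # True: valid in every assignment
print(holds_universally(undistributed))  # False: a counterexample exists
```

The exhaustive check over assignments is exactly what distinguishes the validity of the schema from the truth of any particular premises.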
The other great school of Greek logic is that of the Stoics.[17] Stoic logic traces its roots back to the late 5th-century BC philosopher Euclid of Megara, a pupil of Socrates and slightly older contemporary of Plato. His pupils and successors were called "Megarians", or "Eristics", and later the "Dialecticians". The two most important dialecticians of the Megarian school were Diodorus Cronus and Philo, who were active in the late 4th century BC. The Stoics adopted the Megarian logic and systemized it. The most important member of the school was Chrysippus (c. 278–c. 206 BC), who was its third head, and who formalized much of Stoic doctrine. He is supposed to have written over 700 works, almost none of which survive.[18] Unlike Aristotle, the Megarians and the early Stoics left no complete works, and we have to rely on accounts (sometimes hostile) by Sextus Empiricus, writing in the 3rd century AD.
Three significant contributions of the Stoic school were (i) their account of modality, (ii) their theory of the material conditional, and (iii) their account of meaning and truth.[19]
Formal logic began independently in ancient India and continued to develop through to early modern times, without any known influence from Greek logic.[27] Medhatithi Gautama (c. 6th century BCE) founded the anviksiki school of logic.[28] The Mahabharata (12.173.45), around the 5th century BCE, refers to the anviksiki and tarka schools of logic. Pāṇini (c. 5th century BCE) developed a form of logic (to which Boolean logic has some similarities) for his formulation of Sanskrit grammar. Logic is described by Chanakya (c. 350–283 BCE) in his Arthashastra as an independent field of inquiry, anviksiki.[29]
Two of the six Indian schools of thought deal with logic: Nyaya and Vaisheshika. The Nyaya Sutras of Aksapada Gautama (c. 2nd century CE) constitute the core texts of the Nyaya school, one of the six orthodox schools of Hindu philosophy. This realist school developed a rigid five-member schema of inference involving an initial premise, a reason, an example, an application and a conclusion.[30] The idealist Buddhist philosophy became the chief opponent to the Naiyayikas. Nagarjuna (c. 150–250 CE), the founder of the Madhyamika ("Middle Way") school, developed an analysis known as the catuskoti. This four-cornered argumentation systematically examined and rejected the affirmation of a proposition, its denial, the joint affirmation and denial, and finally, the rejection of its affirmation and denial. But it was with Dignaga (c. 480–540 CE), who developed a formal syllogistic,[31] and his successor Dharmakirti that Buddhist logic reached its height. Their analysis centered on the definition of necessary logical entailment, "vyapti", also known as invariable concomitance or pervasion.[32] To this end a doctrine known as "apoha" or differentiation was developed.[33] This involved what might be called inclusion and exclusion of defining properties.
The difficulties involved in this enterprise, in part, stimulated the neo-scholastic school of Navya-Nyāya, which developed a formal analysis of inference in the sixteenth century. This later school began in eastern India and Bengal, and developed theories resembling modern logic, such as Gottlob Frege's "distinction between sense and reference of proper names" and his "definition of number," as well as the Navya-Nyaya theory of "restrictive conditions for universals" anticipating some of the developments in modern set theory.[34] Since 1824, Indian logic has attracted the attention of many Western scholars, and has influenced important 19th-century logicians such as Charles Babbage, Augustus De Morgan, and particularly George Boole, as confirmed by his wife Mary Everest Boole, who wrote an article entitled "Indian Thought and Western Science in the Nineteenth Century" in 1901.[35]
In China, a contemporary of Confucius, Mozi, "Master Mo", is credited with founding the Mohist school, whose canons dealt with issues relating to valid inference and the conditions of correct conclusions. In particular, one of the schools that grew out of Mohism, the Logicians, is credited by some scholars with an early investigation of formal logic. Unfortunately, due to the harsh rule of Legalism in the subsequent Qin Dynasty, this line of investigation disappeared in China until the introduction of Indian philosophy by Buddhists.
The works of Al-Farabi, Avicenna, Al-Ghazali, Averroes and other Muslim logicians both criticized and developed Aristotelian logic and were important in communicating the ideas of the ancient world to the medieval West.[36] Al-Farabi (Alfarabi) (873–950) was an Aristotelian logician who discussed the topics of future contingents, the number and relation of the categories, the relation between logic and grammar, and non-Aristotelian forms of inference.[37] Al-Farabi also considered the theories of conditional syllogisms and analogical inference, which were part of the Stoic tradition of logic rather than the Aristotelian.[38]
Ibn Sina (Avicenna) (980–1037) was the founder of Avicennian logic, which replaced Aristotelian logic as the dominant system of logic in the Islamic world,[39] and also had an important influence on Western medieval writers such as Albertus Magnus.[40] Avicenna wrote on the hypothetical syllogism[41] and on the propositional calculus, which were both part of the Stoic logical tradition.[42] He developed an original theory of "temporally modalized" syllogistic[37] and made use of inductive logic, such as the methods of agreement, difference and concomitant variation, which are critical to the scientific method.[41] One of Avicenna's ideas had a particularly important influence on Western logicians such as William of Ockham. Avicenna's word for a meaning or notion (ma'na) was translated by the scholastic logicians as the Latin intentio. In medieval logic and epistemology, this is a sign in the mind that naturally represents a thing.[43] This was crucial to the development of Ockham's conceptualism. A universal term (e.g. "man") does not signify a thing existing in reality, but rather a sign in the mind (intentio in intellectu) which represents many things in reality. Ockham cites Avicenna's commentary on Metaphysics V in support of this view.[44]
Fakhr al-Din al-Razi (b. 1149) criticised Aristotle's "first figure" and formulated an early system of inductive logic, foreshadowing the system of inductive logic developed by John Stuart Mill (1806–1873).[45] Al-Razi's work was seen by later Islamic scholars as marking a new direction for Islamic logic, towards a Post-Avicennian logic. This was further elaborated by his student Afdaladdîn al-Khûnajî (d. 1249), who developed a form of logic revolving around the subject matter of conceptions and assents. In response to this tradition, Nasir al-Din al-Tusi (1201–1274) began a tradition of Neo-Avicennian logic which remained faithful to Avicenna's work and existed as an alternative to the more dominant Post-Avicennian school over the following centuries.[46]
Najm al-Dīn al-Qazwīnī al-Kātibī (d. 1276), a student of al-Tusi,[47] was the author of a work on logic, Al-Risāla al-Shamsiyya[48] (Logic for Shams al-Dīn), that was commonly used as the first major text on logic in Sunni madrasahs, down to the 20th century, and is "perhaps the most studied logic textbook of all time."[49] Al-Qazwīnī's logic was largely inspired by Avicenna's formal system of temporal modal logic, but is more elaborate and departs from it in several ways. While Avicenna considered ten modalities and examined six of them, Al-Qazwīnī considers many more modalized propositions and examines thirteen which he considers "customary to investigate".[50]
Systematic refutations of Greek logic were written by the Illuminationist school, founded by Shahab al-Din Suhrawardi (1155–1191), who developed the idea of "decisive necessity", which refers to the reduction of all modalities (necessity, possibility, contingency and impossibility) to the single mode of necessity.[51] Ibn al-Nafis (1213–1288) wrote a book on Avicennian logic, which was a commentary on Avicenna's Al-Isharat (The Signs) and Al-Hidayah (The Guidance).[52] Another systematic refutation of Greek logic was written by Ibn Taymiyyah (1263–1328), the Ar-Radd 'ala al-Mantiqiyyin (Refutation of Greek Logicians), where he argued against the usefulness, though not the validity, of the syllogism[53] and in favour of inductive reasoning.[45] Ibn Taymiyyah also argued against the certainty of syllogistic arguments and in favour of analogy. His argument is that concepts founded on induction are themselves not certain but only probable, and thus a syllogism based on such concepts is no more certain than an argument based on analogy. He further claimed that induction itself is founded on a process of analogy. His model of analogical reasoning was based on that of juridical arguments.[54][55] This model of analogy has been used in the recent work of John F. Sowa.[55]
The Sharh al-takmil fi'l-mantiq written by Muhammad ibn Fayd Allah ibn Muhammad Amin al-Sharwani in the 15th century is the last major Arabic work on logic that has been studied.[56] However, "thousands upon thousands of pages" on logic were written between the 14th and 19th centuries, though only a fraction of the texts written during this period have been studied by historians, hence little is known about the original work on Islamic logic produced during this later period.[46]
"Medieval logic" (also known as "Scholastic logic") generally means the form of Aristotelian logic developed in medieval Europe throughout the period c. 1200–1600.[57] For centuries after Stoic logic had been formulated, it was the dominant system of logic in the classical world. When the study of logic resumed after the Dark Ages, the main source was the work of the Christian philosopher Boethius, who was familiar with some of Aristotle's logic, but almost none of the work of the Stoics.[58] Until the twelfth century the only works of Aristotle available in the West were the Categories, On Interpretation and Boethius' translation of the Isagoge of Porphyry (a commentary on the Categories). These works were known as the "Old Logic" (Logica Vetus or Ars Vetus). An important work in this tradition was the Logica Ingredientibus of Peter Abelard (1079–1142). His direct influence was small,[59] but his influence through pupils such as John of Salisbury was great, and his method of applying rigorous logical analysis to theology shaped the way that theological criticism developed in the period that followed.[60]
By the early thirteenth century the remaining works of Aristotle's Organon (including the Prior Analytics, Posterior Analytics and the Sophistical Refutations) had been recovered in the West.[61] Logical work until then was mostly paraphrasis or commentary on the work of Aristotle.[62] The period from the middle of the thirteenth to the middle of the fourteenth century was one of significant developments in logic, particularly in three areas which were original, with little foundation in the Aristotelian tradition that came before. These were the theory of supposition, the theory of syncategoremata, and the theory of consequences.[63]
The last great works in this tradition are the Logic of John Poinsot (1589–1644, known as John of St Thomas), and the Metaphysical Disputations of Francisco Suarez (1548–1617).
"Traditional Logic" generally means the textbook tradition that begins with Antoine Arnauld and Pierre Nicole's Logic, or the Art of Thinking, better known as the Port-Royal Logic.[68] Published in 1662, it was the most influential work on logic in England until the nineteenth century.[69] The book presents a loosely Cartesian doctrine (that the proposition is a combining of ideas rather than terms, for example) within a framework that is broadly derived from Aristotelian and medieval term logic. Between 1664 and 1700 there were eight editions, and the book had considerable influence after that.[69] The account of propositions that Locke gives in the Essay is essentially that of Port-Royal: "Verbal propositions, which are words, [are] the signs of our ideas, put together or separated in affirmative or negative sentences. So that proposition consists in the putting together or separating these signs, according as the things which they stand for agree or disagree." (Locke, An Essay Concerning Human Understanding, IV. 5. 6)
Another influential work was the Novum Organum by Francis Bacon, published in 1620. The title translates as "new instrument". This is a reference to Aristotle's work Organon. In this work, Bacon rejected the syllogistic method of Aristotle in favour of an alternative procedure "which by slow and faithful toil gathers information from things and brings it into understanding".[70] This method is known as inductive reasoning. The inductive method starts from empirical observation and proceeds to lower axioms or propositions. From the lower axioms more general ones can be derived (by induction). In finding the cause of a phenomenal nature such as heat, one must list all of the situations where heat is found. Then another list should be drawn up, listing situations that are similar to those of the first list except for the lack of heat. A third table lists situations where heat can vary. The form nature, or cause, of heat must be that which is common to all instances in the first table, is lacking from all instances of the second table and varies by degree in instances of the third table.
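Bacon's tables of presence and absence amount to a simple eliminative procedure (the table of degrees would additionally track covariation). The sketch below uses entirely hypothetical observations, and real Baconian induction is far richer; it shows only the mechanics of eliminating candidate causes against the first two tables.

```python
# Toy observations (hypothetical data, not Bacon's own examples):
# each instance records which attributes were observed.
presence = [   # situations where heat is found
    {"sunlight": True, "motion": True, "moisture": False},
    {"sunlight": False, "motion": True, "moisture": True},
]
absence = [    # similar situations lacking heat
    {"sunlight": True, "motion": False, "moisture": False},
    {"sunlight": False, "motion": False, "moisture": True},
]

def candidate_forms(presence, absence):
    """Attributes present in every presence-instance and in no
    absence-instance -- Bacon's surviving candidates for the 'form'."""
    attrs = presence[0].keys()
    return [a for a in attrs
            if all(inst[a] for inst in presence)
            and not any(inst[a] for inst in absence)]

print(candidate_forms(presence, absence))  # ['motion']
```

Sunlight and moisture are eliminated because each is missing from some situation where heat occurs; only motion survives both tables.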
Other works in the textbook tradition include Isaac Watts' Logick: Or, the Right Use of Reason (1725), Richard Whately's Logic (1826), and John Stuart Mill's A System of Logic (1843). Although the latter was one of the last great works in the tradition, Mill's view that the foundations of logic lay in introspection[71] influenced the view that logic is best understood as a branch of psychology, an approach to the subject which dominated the next fifty years of its development, especially in Germany.[72]
G.W.F. Hegel indicated the importance of logic to his philosophical system when he condensed his extensive Science of Logic into a shorter work published in 1817 as the first volume of his Encyclopaedia of the Philosophical Sciences. The "Shorter" or "Encyclopaedia" Logic, as it is often known, lays out a series of transitions which leads from the most empty and abstract of categories – Hegel begins with "Pure Being" and "Pure Nothing" – to the "Absolute" – the category which contains and resolves all the categories which preceded it. Despite the title, Hegel's Logic is not really a contribution to the science of valid inference. Rather than deriving conclusions about concepts through valid inference from premises, Hegel seeks to show that thinking about one concept compels thinking about another concept (one cannot, he argues, possess the concept of "Quality" without the concept of "Quantity"); and the compulsion here is not a matter of individual psychology, but arises almost organically from the content of the concepts themselves. His purpose is to show the rational structure of the "Absolute" – indeed of rationality itself. The method by which thought is driven from one concept to its contrary, and then to further concepts, is known as the Hegelian dialectic.
Although Hegel's Logic has had little impact on mainstream logical studies, its influence can be seen in the work of the British Idealists – for example in F.H. Bradley's Principles of Logic (1883) – and in the economic, political and philosophical studies of Karl Marx and the various schools of Marxism.
Between the work of Mill and Frege stretched half a century during which logic was widely treated as a descriptive science, an empirical study of the structure of reasoning, and thus essentially as a branch of psychology.[73] The German psychologist Wilhelm Wundt, for example, discussed deriving "the logical from the psychological laws of thought", emphasizing that "psychological thinking is always the more comprehensive form of thinking."[74] This view was widespread among German philosophers of the period: Theodor Lipps described logic as "a specific discipline of psychology";[75] Christoph von Sigwart understood logical necessity as grounded in the individual's compulsion to think in a certain way;[76] and Benno Erdmann argued that "logical laws only hold within the limits of our thinking".[77] Such was the dominant view of logic in the years following Mill's work.[78] This psychological approach to logic was rejected by Gottlob Frege. It was also subjected to an extended and destructive critique by Edmund Husserl in the first volume of his Logical Investigations (1900), an assault which has been described as "overwhelming".[79] Husserl argued forcefully that grounding logic in psychological observations implied that all logical truths remained unproven, and that skepticism and relativism were unavoidable consequences.
Such criticisms did not immediately extirpate so-called "psychologism". For example, the American philosopher Josiah Royce, while acknowledging the force of Husserl's critique, remained "unable to doubt" that progress in psychology would be accompanied by progress in logic, and vice versa.[80]
After the long period of decline and neglect already noted, logic revived in the mid-nineteenth century, at the beginning of a revolutionary period in which the subject developed into a rigorous and formalistic discipline whose exemplar was the exact method of proof used in mathematics.[1][2]
A number of features distinguish modern logic from the old Aristotelian or traditional logic, the most important of which are as follows.[81]

First, modern logic is fundamentally a calculus whose rules of operation are determined only by the shape and not by the meaning of the symbols it employs, as in mathematics. Many logicians were impressed by the "success" of mathematics, in that there has been no prolonged dispute about any properly mathematical result. C.S. Peirce noted[82] that even though a mistake in the evaluation of a definite integral by Laplace led to an error concerning the moon's orbit that persisted for nearly 50 years, the mistake, once spotted, was corrected without any serious dispute. Peirce contrasted this with the disputation and uncertainty surrounding traditional logic, and especially reasoning in metaphysics. He argued that a truly "exact" logic would depend upon mathematical, i.e., "diagrammatic" or "iconic" thought: "Those who follow such methods will ... escape all error except such as will be speedily corrected after it is once suspected".

Second, modern logic is "constructive" rather than "abstractive"; i.e., rather than abstracting and formalising theorems derived from ordinary language (or from psychological intuitions about validity), it constructs theorems by formal methods, then looks for an interpretation in ordinary language. Third, it is entirely symbolic, meaning that even the logical constants (which the medieval logicians called "syncategoremata") and the categoric terms are expressed in symbols. Finally, modern logic strictly avoids psychological, epistemological and metaphysical questions.[83]
The development of modern logic falls into roughly five periods:[84] the embryonic period from Leibniz to 1847; the algebraic period from Boole's Analysis to Schröder's Vorlesungen; the logicist period from Frege's Begriffsschrift to Russell and Whitehead's Principia Mathematica; the metamathematical period from 1910 to the 1930s; and the period after World War II, when mathematical logic branched into model theory, proof theory, computability theory and set theory.
The idea that inference could be represented by a purely mechanical process is found as early as Raymond Lull, who proposed a (somewhat eccentric) method of drawing conclusions by a system of concentric rings. Three hundred years later, the English philosopher and logician Thomas Hobbes suggested that all logic and reasoning could be reduced to the mathematical operations of addition and subtraction.[86] The same idea is found in the work of Leibniz, who had read both Lull and Hobbes, and who argued that logic can be represented through a combinatorial process or calculus. But, like Lull and Hobbes, he failed to develop a detailed or comprehensive system, and his work on this topic was not published until long after his death. Leibniz says that ordinary languages are subject to "countless ambiguities" and are unsuited for a calculus, whose task is to expose mistakes in inference arising from the forms and structures of words;[87] hence, he proposed to identify an alphabet of human thought comprising fundamental concepts which could be composed to express complex ideas,[88] and create a calculus ratiocinator which would make reasoning "as tangible as those of the Mathematicians, so that we can find our error at a glance, and when there are disputes among persons, we can simply say: Let us calculate."[89]
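One of Leibniz's own trial schemes (in drafts of 1679) assigned "characteristic numbers" to simple concepts and products of them to complex concepts, so that a universal affirmative becomes a divisibility test. The sketch below uses hypothetical number assignments to show the mechanics of the idea; it is an illustration, not Leibniz's actual tables.

```python
# Hypothetical characteristic numbers: a complex concept gets the
# product of its components' numbers.
animal, rational = 2, 3
man = animal * rational   # "man = rational animal"

def every(s, p):
    """'Every S is P' holds when P's characteristic number divides S's."""
    return s % p == 0

print(every(man, animal))    # True: every man is an animal
print(every(animal, man))    # False: not every animal is a man
```

The divisibility test makes an inference checkable by pure calculation, which is exactly the ambition behind the calculus ratiocinator.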
Gergonne (1816) said that reasoning does not have to be about objects about which we have perfectly clear ideas, since algebraic operations can be carried out without our having any idea of the meaning of the symbols involved.[90] Bolzano anticipated a fundamental idea of modern proof theory when he defined logical consequence or "deducibility" in terms of variables: a set of propositions n, o, p ... is deducible from propositions a, b, c ... in respect of the variables i, j, ... if any substitution for i, j that has the effect of making a, b, c ... true simultaneously makes the propositions n, o, p ... true as well.[91] This is now known as semantic validity.
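Restricted to Boolean variables, Bolzano's definition is exactly the modern test for semantic validity: run through every substitution for the variables and check that whenever all the premises come out true, so do all the conclusions. A minimal sketch:

```python
from itertools import product

def deducible(premises, conclusions, n_vars):
    """Bolzano-style deducibility over Boolean variables: every assignment
    that makes all premises true also makes all conclusions true."""
    for vals in product([False, True], repeat=n_vars):
        if all(p(*vals) for p in premises) and not all(c(*vals) for c in conclusions):
            return False
    return True

# "i or j" is deducible from "i and j", but not conversely.
print(deducible([lambda i, j: i and j], [lambda i, j: i or j], 2))  # True
print(deducible([lambda i, j: i or j], [lambda i, j: i and j], 2))  # False
```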
Modern logic begins with the so-called "algebraic school", originating with Boole and including Peirce, Jevons, Schröder and Venn.[92] Their objective was to develop a calculus to formalise reasoning in the area of classes, propositions and probabilities. The school begins with Boole's seminal work Mathematical Analysis of Logic which appeared in 1847, although De Morgan (1847) is its immediate precursor.[93] The fundamental idea of Boole's system is that algebraic formulae can be used to express logical relations. This idea occurred to Boole in his teenage years, while he was working as an usher in a private school in Lincoln, Lincolnshire.[94] For example, let x and y stand for classes, let the symbol = signify that the classes have the same members, let xy stand for the class containing all and only the members of both x and y, and so on. Boole calls these elective symbols, i.e. symbols which select certain objects for consideration.[95] An expression in which elective symbols are used is called an elective function, and an equation of which the members are elective functions is an elective equation.[96] The theory of elective functions and their "development" is essentially the modern idea of truth-functions and their expression in disjunctive normal form.[95]
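Boole's "development" of an elective function is, in modern terms, its expansion into disjunctive normal form. The sketch below verifies the two-variable expansion theorem against Boole's algebraic expression for "x or y"; the lambda encoding is an illustrative modern rendering, not Boole's own notation.

```python
from itertools import product

def development(f):
    """Boole's 'development' of an elective function of two symbols:
    f(x, y) = f(1,1)xy + f(1,0)x(1-y) + f(0,1)(1-x)y + f(0,0)(1-x)(1-y),
    the algebraic counterpart of disjunctive normal form."""
    def developed(x, y):
        return (f(1, 1)*x*y + f(1, 0)*x*(1 - y)
                + f(0, 1)*(1 - x)*y + f(0, 0)*(1 - x)*(1 - y))
    return developed

f = lambda x, y: x + y - x*y     # Boole's algebraic expression for "x or y"
g = development(f)
print(all(f(x, y) == g(x, y) for x, y in product([0, 1], repeat=2)))  # True
```

Each product term in the development selects exactly one row of the truth-table, which is why the expansion coincides with disjunctive normal form.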
Boole's system admits of two interpretations, in class logic, and propositional logic. Boole distinguished between "primary propositions" which are the subject of syllogistic theory, and "secondary propositions", which are the subject of propositional logic, and showed how under different "interpretations" the same algebraic system could represent both. An example of a primary proposition is "All inhabitants are either Europeans or Asiatics." An example of a secondary proposition is "Either all inhabitants are Europeans or they are all Asiatics."[97] These are easily distinguished in modern propositional calculus, where it is also possible to show that the first follows from the second, but it is a significant disadvantage that there is no way of representing this in the Boolean system.[98]
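The relation between the two readings can be checked mechanically. A minimal sketch over made-up three-member populations (with a hypothetical marker "O" for an inhabitant belonging to neither class), verifying that the primary proposition follows from the secondary one but not conversely:

```python
from itertools import product

def primary(pop):
    """'All inhabitants are either Europeans or Asiatics.'"""
    return all(x in ("E", "A") for x in pop)

def secondary(pop):
    """'Either all inhabitants are Europeans or they are all Asiatics.'"""
    return all(x == "E" for x in pop) or all(x == "A" for x in pop)

pops = list(product("EAO", repeat=3))   # all tiny populations of size 3

print(all(primary(p) for p in pops if secondary(p)))   # True: 2nd implies 1st
print(all(secondary(p) for p in pops if primary(p)))   # False: not conversely
```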
In his Symbolic Logic (1881), John Venn used diagrams of overlapping areas to express Boolean relations between classes or truth-conditions of propositions. In 1869 Jevons realised that Boole's methods could be mechanised, and constructed a "logical machine" which he showed to the Royal Society the following year.[95] In 1885 Allan Marquand proposed an electrical version of the machine that is still extant (picture at the Firestone Library).
The defects in Boole's system (such as the use of the letter v for existential propositions) were all remedied by his followers. Jevons published Pure Logic, or the Logic of Quality apart from Quantity in 1864, where he suggested a symbol to signify exclusive or, which allowed Boole's system to be greatly simplified.[99] This was usefully exploited by Schröder when he set out theorems in parallel columns in his Vorlesungen (1890–1905). Peirce (1880) showed how all the Boolean elective functions could be expressed by the use of a single primitive "either ... or"; however, like many of Peirce's innovations, this passed without notice until Sheffer rediscovered it in 1913.[100] Boole's early work also lacks the idea of the logical sum which originates in Peirce (1867), Schröder (1877) and Jevons (1890),[101] and the concept of inclusion, first suggested by Gergonne (1816) and clearly articulated by Peirce (1870).
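Peirce's single-primitive result can be illustrated by taking the primitive as joint denial ("neither ... nor", now usually written ↓); Peirce in fact observed that either joint denial or alternative denial (Sheffer's stroke) suffices. A brief sketch recovering negation, disjunction and conjunction from joint denial alone:

```python
def nor(p, q):
    """Joint denial: true exactly when neither p nor q is true."""
    return not (p or q)

def neg(p):       return nor(p, p)
def lor(p, q):    return nor(nor(p, q), nor(p, q))
def land(p, q):   return nor(nor(p, p), nor(q, q))

bools = [False, True]
print(all(neg(p) == (not p) for p in bools))                        # True
print(all(lor(p, q) == (p or q) for p in bools for q in bools))     # True
print(all(land(p, q) == (p and q) for p in bools for q in bools))   # True
```

Since every truth-function is expressible from negation, disjunction and conjunction, the single primitive suffices for all of them.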
The success of Boole's algebraic system suggested that all logic must be capable of algebraic representation, and there were attempts to express a logic of relations in such form, of which the most ambitious was Schröder's monumental Vorlesungen über die Algebra der Logik ("Lectures on the Algebra of Logic", vol iii 1895), although the original idea was again anticipated by Peirce.[102]
After Boole, the next great advances were made by the German mathematician Gottlob Frege. Frege's objective was the program of logicism, i.e. demonstrating that arithmetic is identical with logic.[103] Frege went much further than any of his predecessors in his rigorous and formal approach to logic, and his calculus or Begriffsschrift is considered the greatest single achievement in the entire history of logic.[103] Frege also tried to show that the concept of number can be defined by purely logical means, so that (if he was right) logic includes arithmetic and all branches of mathematics that are reducible to arithmetic. He was not the first writer to suggest this. In his pioneering work Die Grundlagen der Arithmetik (The Foundations of Arithmetic), sections 15–17, he acknowledges the efforts of Leibniz and J.S. Mill, as well as Jevons, citing the latter's claim that "algebra is a highly developed logic, and number but logical discrimination."[104]
Frege's first work, the Begriffsschrift ("concept script"), is a rigorously axiomatised system of propositional logic, relying on just two connectives (negation and the conditional), two rules of inference (modus ponens and substitution), and six axioms. Frege referred to the "completeness" of this system, but was unable to prove it.[105] The most significant innovation, however, was his explanation of the quantifier in terms of mathematical functions. Traditional logic regards the sentence "Caesar is a man" as of fundamentally the same form as "all men are mortal." Sentences with a proper name subject were regarded as universal in character, interpretable as "every Caesar is a man".[106] Frege argued that the quantifier expression "all men" does not have the same logical or semantic form as the proper name "Caesar", and that the universal proposition "every A is B" is a complex proposition involving two functions, namely ' – is A' and ' – is B', such that whatever satisfies the first also satisfies the second. In modern notation, this would be expressed as ∀x(A(x) → B(x)).
In English, "for all x, if Ax then Bx". Thus only singular propositions are of subject-predicate form, and they are irreducibly singular, i.e. not reducible to a general proposition. Universal and particular propositions, by contrast, are not of simple subject-predicate form at all. If "all mammals" were the logical subject of the sentence "all mammals are land-dwellers", then to negate the whole sentence we would have to negate the predicate to give "all mammals are not land-dwellers". But this is not the case.[107] This functional analysis of ordinary-language sentences later had a great impact on philosophy and linguistics.
This means that in Frege's calculus, Boole's "primary" propositions can be represented in a different way from "secondary" propositions. "All inhabitants are either Europeans or Asiatics" is ∀x[I(x) → (E(x) ∨ A(x))],
whereas "All the inhabitants are Europeans or all the inhabitants are Asiatics" is ∀x[I(x) → E(x)] ∨ ∀x[I(x) → A(x)].
As Frege remarked in a critique of Boole's calculus:
As well as providing a unified and comprehensive system of logic, Frege's calculus also resolved the ancient problem of multiple generality. The ambiguity of "every girl kissed a boy" is difficult to express in traditional logic, but Frege's logic captures this through the different scope of the quantifiers. Thus ∀x[Girl(x) → ∃y(Boy(y) ∧ Kissed(x, y))]
means that to every girl there corresponds some boy (any one will do) whom the girl kissed. But ∃y[Boy(y) ∧ ∀x(Girl(x) → Kissed(x, y))]
means that there is some particular boy whom every girl kissed. Without this device, the project of logicism would have been doubtful or impossible. Using it, Frege provided a definition of the ancestral relation, of the many-to-one relation, and of mathematical induction.[109]
This period overlaps with the work of the so-called "mathematical school", which included Dedekind, Pasch, Peano, Hilbert, Zermelo, Huntington, Veblen and Heyting. Their objective was the axiomatisation of branches of mathematics like geometry, arithmetic, analysis and set theory.
The logicist project received a near-fatal setback with the discovery of a paradox in 1901 by Bertrand Russell. This proved that Frege's naive set theory led to a contradiction. Frege's theory held that for any formal criterion, there is a set of all objects that meet the criterion. Russell showed that a set containing exactly the sets that are not members of themselves would contradict its own definition: if it is not a member of itself, it is a member of itself, and if it is a member of itself, it is not.[110] This contradiction is now known as Russell's paradox. One important method of resolving this paradox was proposed by Ernst Zermelo.[111] Zermelo set theory was the first axiomatic set theory; it was later developed into the now-canonical Zermelo–Fraenkel set theory (ZF).
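In standard modern notation, the paradox can be displayed in a single line: from the assumption that the set R below exists (as Frege's comprehension principle guarantees), either answer to the question of whether R is a member of itself immediately yields the other.

```latex
R = \{\, x \mid x \notin x \,\}
\qquad\Longrightarrow\qquad
R \in R \;\Longleftrightarrow\; R \notin R
```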
The monumental Principia Mathematica, a three-volume work on the foundations of mathematics written by Russell and Alfred North Whitehead and published in 1910–13, also included an attempt to resolve the paradox, by means of an elaborate system of types: a set is of a higher type than each of its elements, so one cannot speak of the "set of all sets". The Principia was an attempt to derive all mathematical truths from a well-defined set of axioms and inference rules in symbolic logic.
The names of Gödel and Tarski dominate the 1930s,[112] a crucial period in the development of metamathematics – the study of mathematics using mathematical methods to produce metatheories, or mathematical theories about other mathematical theories. Early investigations into metamathematics had been driven by Hilbert's program, which sought to resolve the ongoing crisis in the foundations of mathematics by grounding all of mathematics in a finite set of axioms, proving its consistency by "finitistic" means and providing a procedure which would decide the truth or falsity of any mathematical statement. Work on metamathematics culminated in the work of Gödel, who in 1929 showed that a given first-order sentence is deducible if and only if it is logically valid – i.e. true in every structure for its language. This is known as Gödel's completeness theorem. A year later, he proved two important theorems, which showed Hilbert's program to be unattainable in its original form. The first is that no consistent system of axioms whose theorems can be listed by an effective procedure such as an algorithm or computer program is capable of proving all facts about the natural numbers. For any such system, there will always be statements about the natural numbers that are true, but that are unprovable within the system. The second is that if such a system is also capable of proving certain basic facts about the natural numbers, then it cannot prove its own consistency. These two results are known as Gödel's incompleteness theorems, or simply Gödel's Theorem. Later in the decade, Gödel developed the concept of set-theoretic constructibility, as part of his proof that the axiom of choice and the continuum hypothesis are consistent with Zermelo–Fraenkel set theory.
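Stated schematically in modern notation (not Gödel's own), with T any consistent, effectively axiomatized theory containing enough arithmetic:

```latex
% Completeness (1929): first-order provability coincides with validity
\vdash \varphi \;\iff\; \models \varphi
% First incompleteness theorem: some sentence G_T is undecided by T
T \nvdash G_T \quad\text{and}\quad T \nvdash \neg G_T
% Second incompleteness theorem: T cannot prove its own consistency
T \nvdash \mathrm{Con}(T)
```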
In proof theory, Gerhard Gentzen developed natural deduction and the sequent calculus. The former attempts to model logical reasoning as it 'naturally' occurs in practice and is most easily applied to intuitionistic logic, while the latter was devised to clarify the derivation of logical proofs in any formal system. Since Gentzen's work, natural deduction and sequent calculi have been widely applied in the fields of proof theory, mathematical logic and computer science. Gentzen also proved normalization and cut-elimination theorems for intuitionistic and classical logic which could be used to reduce logical proofs to a normal form.[113][114]
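Gentzen's cut rule, which his cut-elimination theorem (the Hauptsatz) shows can be removed from any sequent-calculus derivation, has the following standard form:

```latex
\frac{\Gamma \vdash \Delta, A \qquad\qquad A, \Sigma \vdash \Pi}
     {\Gamma, \Sigma \vdash \Delta, \Pi}\;(\text{Cut})
```

Eliminating the intermediate formula A yields derivations in which every formula is a subformula of the conclusion, which is the sense in which proofs are reduced to a normal form.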
Alfred Tarski, a pupil of Łukasiewicz, is best known for his definition of truth and logical consequence, and the semantic concept of logical satisfaction. In 1933, he published (in Polish) The concept of truth in formalized languages, in which he proposed his semantic theory of truth: a sentence such as "snow is white" is true if and only if snow is white. Tarski's theory separated the metalanguage, which makes the statement about truth, from the object language, which contains the sentence whose truth is being asserted, and gave a correspondence (the T-schema) between phrases in the object language and elements of an interpretation. Tarski's approach to the difficult idea of explaining truth has been enduringly influential in logic and philosophy, especially in the development of model theory.[115] Tarski also produced important work on the methodology of deductive systems, and on fundamental principles such as completeness, decidability, consistency and definability. According to Anita Feferman, Tarski "changed the face of logic in the twentieth century".[116]
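Tarski's adequacy condition requires that a definition of truth entail every instance of the T-schema, in which a name of an object-language sentence appears on the left and its metalanguage translation on the right:

```latex
\mathrm{True}(\ulcorner \varphi \urcorner) \;\leftrightarrow\; \varphi
% e.g. "snow is white" is true \;\leftrightarrow\; snow is white
```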
Alonzo Church and Alan Turing proposed formal models of computability, giving independent negative solutions to Hilbert's Entscheidungsproblem in 1936 and 1937, respectively. The Entscheidungsproblem asked for a procedure that, given any formal mathematical statement, would algorithmically determine whether the statement is universally valid. Church and Turing proved there is no such procedure; Turing's paper introduced the halting problem as a key example of a mathematical problem without an algorithmic solution.
Church's system for computation developed into the modern λ-calculus, while the Turing machine became a standard model for a general-purpose computing device. It was soon shown that many other proposed models of computation were equivalent in power to those proposed by Church and Turing. These results led to the Church–Turing thesis that any deterministic algorithm that can be carried out by a human can be carried out by a Turing machine. Church proved additional undecidability results, showing that both Peano arithmetic and first-order logic are undecidable. Later work by Emil Post and Stephen Cole Kleene in the 1940s extended the scope of computability theory and introduced the concept of degrees of unsolvability.
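Church's λ-calculus survives almost unchanged in any modern language with first-class functions. As an illustrative sketch (the Python names below are our own, not Church's notation), Church numerals encode the number n as the function that applies its argument n times, and addition combines the two applications:

```python
# Church numerals: n is encoded as "apply f to x exactly n times"
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))   # one more application of f
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting how many times f is applied."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # prints 5
```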
The results of the first few decades of the twentieth century also had an impact upon analytic philosophy and philosophical logic, particularly from the 1950s onwards, in subjects such as modal logic, temporal logic, deontic logic, and relevance logic.
After World War II, mathematical logic branched into four inter-related but separate areas of research: model theory, proof theory, computability theory, and set theory.[117]
In set theory, the method of forcing revolutionized the field by providing a robust method for constructing models and obtaining independence results. Paul Cohen introduced this method in 1962 to prove the independence of the continuum hypothesis and the axiom of choice from Zermelo–Fraenkel set theory.[118] His technique, which was simplified and extended soon after its introduction, has since been applied to many other problems in all areas of mathematical logic.
Computability theory had its roots in the work of Turing, Church, Kleene, and Post in the 1930s and 40s. It developed into a study of abstract computability, which became known as recursion theory.[119] The priority method, discovered independently by Albert Muchnik and Richard Friedberg in the 1950s, led to major advances in the understanding of the degrees of unsolvability and related structures. Research into higher-order computability theory demonstrated its connections to set theory. The fields of constructive analysis and computable analysis were developed to study the effective content of classical mathematical theorems; these in turn inspired the program of reverse mathematics. A separate branch of computability theory, computational complexity theory, was also characterized in logical terms as a result of investigations into descriptive complexity.
Model theory applies the methods of mathematical logic to study models of particular mathematical theories. Alfred Tarski published much pioneering work in the field, which is named after a series of papers he published under the title Contributions to the theory of models. In the 1960s, Abraham Robinson used model-theoretic techniques to develop calculus and analysis based on infinitesimals, a problem that had first been proposed by Leibniz.
In proof theory, the relationship between classical mathematics and intuitionistic mathematics was clarified via tools such as the realizability method invented by Georg Kreisel and Gödel's Dialectica interpretation. This work inspired the contemporary area of proof mining. The Curry–Howard correspondence emerged as a deep analogy between logic and computation, including a correspondence between systems of natural deduction and typed lambda calculi used in computer science. As a result, research into this class of formal systems began to address both logical and computational aspects; this area of research came to be known as modern type theory. Advances were also made in ordinal analysis and the study of independence results in arithmetic such as the Paris–Harrington theorem.
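The correspondence can be glimpsed even in ordinary typed code. In the hypothetical sketch below (function names are our own), a term of type A → A is read as a proof of the tautology A → A, and function application plays the role of modus ponens:

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")

def identity(a: A) -> A:
    # a proof of A -> A: the identity combinator
    return a

def modus_ponens(f: Callable[[A], B], a: A) -> B:
    # from a proof of A -> B and a proof of A, obtain a proof of B;
    # under Curry-Howard this is simply function application
    return f(a)

def syllogism(f: Callable[[A], B], g: Callable[[B], A]) -> Callable[[A], A]:
    # hypothetical syllogism: (A -> B) and (B -> A) give A -> A,
    # realized as function composition
    return lambda a: g(f(a))
```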
This was also a period, particularly in the 1950s and afterwards, when the ideas of mathematical logic began to influence philosophical thinking. For example, tense logic is a formalised system for representing, and reasoning about, propositions qualified in terms of time. The philosopher Arthur Prior played a significant role in its development in the 1960s. Modal logics extend the scope of formal logic to include the elements of modality (for example, possibility and necessity). The ideas of Saul Kripke, particularly about possible worlds, and the formal system now called Kripke semantics have had a profound impact on analytic philosophy.[120] His best known and most influential work is Naming and Necessity (1980).[121] Deontic logics are closely related to modal logics: they attempt to capture the logical features of obligation, permission and related concepts. Ernst Mally, a pupil of Alexius Meinong, was the first to propose a formal deontic system, in his Grundgesetze des Sollens, based on the syntax of Whitehead and Russell's propositional calculus. Another logical system developed after World War II was fuzzy logic, introduced by the Iranian mathematician Lotfi Asker Zadeh in 1965.
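Kripke's possible-worlds semantics is simple enough to execute directly. The toy evaluator below (the two-world frame and valuation are invented purely for illustration) interprets "necessarily" (□) as truth at every accessible world and "possibly" (◇) as truth at some accessible world:

```python
# A toy Kripke-semantics evaluator over an invented two-world frame.
worlds = {"w1", "w2"}
access = {"w1": {"w1", "w2"}, "w2": {"w2"}}   # which worlds each world "sees"
val = {"p": {"w2"}}                            # the atom p holds only at w2

def holds(world, formula):
    """Evaluate a formula (a nested tuple) at a world."""
    op = formula[0]
    if op == "atom":
        return world in val[formula[1]]
    if op == "not":
        return not holds(world, formula[1])
    if op == "box":    # "necessarily": true at every accessible world
        return all(holds(w, formula[1]) for w in access[world])
    if op == "dia":    # "possibly": true at some accessible world
        return any(holds(w, formula[1]) for w in access[world])
    raise ValueError(f"unknown connective: {op}")

print(holds("w1", ("dia", ("atom", "p"))))  # True: p holds at accessible w2
print(holds("w1", ("box", ("atom", "p"))))  # False: p fails at accessible w1
```

Because w1 sees a world where p fails and another where p holds, ◇p is true but □p is false there, while at w2 (which sees only itself) □p is true.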